Judge Tosses Biometric Data Suit Against X

Source: reason.com 6/19/24

X’s child porn detection system doesn’t violate an Illinois biometric privacy law, the judge ruled.

A federal judge dismissed a lawsuit concerning the software X (formerly Twitter) uses to find illegal porn images. The suit was brought by Mark Martell, who objected to X using Microsoft’s PhotoDNA software.

Martell argued that PhotoDNA—which is used across the tech industry to detect and report child porn—required the collection of biometric data and that this collection violated Illinois’ Biometric Information Privacy Act (BIPA).

A win for Martell could have imperiled the use of PhotoDNA and similar software by all sorts of tech companies, thwarting tools that have proved useful in fighting sexually explicit images of minors, non-consensual adult pornography (a.k.a. “revenge porn”), terroristic images, and extremely violent content. Tech companies voluntarily employ these tools in a way that seems minimally invasive to the privacy of your average user—no biometric data collection actually required.

So, while “dude loses biometric privacy suit against big tech” may seem on its surface like a sad story, it’s probably good news that U.S. District Judge Sunil R. Harjani granted X’s motion to dismiss the case.

Read the full article

 

12 Comments

Here we see a judge protecting big tech capitalism and the registry cottage industry, thus ensuring that all the court and jailhouse doors remain squeak-free of rust.

This is the correct call by the court. It would be a technological absurdity to make it illegal to hash images and compare them against a list of known hashes. It’s a core piece of computing, and involves no biometrics, as the article points out.

Any computer file can be thought of as a binary string, a very long sequence of 1s and 0s. A cryptographic hash function (CHF) takes such a string and maps it to a fixed-length string (256 bits is a very common length) with properties that make it useful as an identifier: it is functionally impossible to reverse, any particular output is extremely unlikely to occur by chance, and it is highly unlikely that two different input files will produce the same output (called a collision). It’s not really a “secret signature” (to use the article’s phrase), because it’s not some secret extra data but a basic mathematical consequence of how digital computing works. Hashing is widely used for file integrity monitoring, malware identification, and so on.
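To make those properties concrete, here’s a minimal Python sketch (standard-library hashlib only, nothing PhotoDNA-specific) that hashes two inputs differing by a single character; the digests come out the same fixed length but bear no resemblance to each other:

```python
import hashlib

# Two inputs that differ by only one character.
msg1 = b"hello world"
msg2 = b"hello world!"

# SHA-256 always produces a 256-bit digest (64 hex characters),
# and even a one-character change yields a completely different value.
print(hashlib.sha256(msg1).hexdigest())
print(hashlib.sha256(msg2).hexdigest())
```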

Indeed, you can get an image’s hash right now if you’re familiar with the command line. Open a terminal on Linux or macOS (or PowerShell on Windows), navigate to a directory with images, and run “sha256sum <filename>” (Linux), “shasum -a 256 <filename>” (macOS), or “Get-FileHash <filename>” (Windows PowerShell). This will print the SHA-256 hash of that image.
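And the “compare against a list of known hashes” step is just a set lookup. Here’s a minimal Python sketch of that general idea (not of PhotoDNA’s own hashing scheme); the KNOWN_HASHES entry and the example.jpg filename are placeholders, not real values:

```python
import hashlib
from pathlib import Path

# Placeholder set of previously computed SHA-256 digests (hex strings).
# A real deployment would load a much larger list from a trusted source.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # example value only
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_hash(path: Path) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_hash(Path("example.jpg")))  # placeholder filename
```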

Notice how all personal data concerns raised by big tech or the government are always packaged and presented as combating CP. A neat little trick to get people to overlook the potential (even likely) abuses of such procedures/software. And apparently a pretty effective one.

Come to think of it, shouldn’t those companies and the individuals that work in those specific departments (FBI included) be indicted for CP possession? How exactly are these companies/agents not “re-victimizing” those depicted if that is accomplished by simple viewing, as the public has been led to believe?

Crimes like CP distribution/viewing happen in private… if you want privacy, this is the price. Combat CP by other means, or sacrifice all privacy to end it through universal, inescapable surveillance.